In this paper, feedforward neural networks are presented whose nonlinear weight functions are based on look-up tables smoothed by a regularization called diffusion. The idea behind this type of network is the hypothesis that a greater number of adaptive parameters per weight function might reduce the total number of weight functions needed to solve a given problem. If, in addition, the computational cost of propagating through a single such weight function is kept low, the introduced neural networks may be relatively fast. A number of tests are performed, showing that the presented neural networks can indeed perform better in some cases than classic neural networks and a number of other learning machines.
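The two ingredients named above, a per-weight look-up table evaluated by interpolation and a diffusion-style smoothing of its entries, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the linear interpolation, and the explicit heat-equation smoothing step are all assumptions made for the example.

```python
import numpy as np

def lut_weight(x, table, x_min=-1.0, x_max=1.0):
    """Evaluate a look-up-table weight function at input x by linear
    interpolation between adjacent table entries (hypothetical sketch).
    The table entries play the role of the adaptive parameters."""
    n = len(table)
    # Map x to a fractional index into the table, clamping to the range.
    t = (np.clip(x, x_min, x_max) - x_min) / (x_max - x_min) * (n - 1)
    i = min(int(np.floor(t)), n - 2)
    frac = t - i
    return (1.0 - frac) * table[i] + frac * table[i + 1]

def diffuse(table, rate=0.25, steps=1):
    """Smooth the table with discrete diffusion (explicit heat-equation)
    steps: each interior entry moves toward the mean of its neighbours,
    regularizing the weight function without changing its overall shape."""
    t = np.asarray(table, dtype=float).copy()
    for _ in range(steps):
        t[1:-1] = t[1:-1] + rate * (t[:-2] - 2.0 * t[1:-1] + t[2:])
    return t
```

Evaluating such a weight function costs one table lookup and one interpolation, independent of the table size, which is the property the abstract appeals to when arguing that the networks might remain fast despite having many parameters per weight.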